DirectX 10 Game Programming : Shaders and Effects - Effect Files

While using shaders individually is still possible with Direct3D10, you'll find them extremely useful when grouped together into an effect. An effect is a simple way of packaging the vertex, pixel, and geometry shaders needed to render objects in a particular way. The effect is loaded as a single object, and the included shaders are executed when necessary. By changing the effect you apply to your scene, you can easily change the method Direct3D uses to do the rendering. Effects are defined with an effect file: a text format that is loaded from disk, compiled, and executed.

Effect File Layout

Effect files are a way of containing a particular set of rendering functionality. Each effect, applied when drawing objects in your scene, dictates what the objects look like and how they’re drawn. For example, you may create an effect whose job it is to texture objects; or you may create an effect to generate lighting bloom or blur. Effects have an amazing versatility in how they can be used.

Previously, vertex and pixel shaders were loaded and applied separately. Effects combine the shaders into a self-contained unit that encompasses the functionality of multiple shader types.

Effects are made up of several different sections:

  • External variables— Variables that get their data from the calling program.

  • Input structures— Structures that define the information being passed between shaders. For example, information output from a vertex shader and passed as input into the pixel shader.

  • Vertex shader— Portion of the effect file that handles processing of vertices.

  • Pixel shader— Portion of the effect file that handles pixels.

  • Technique block(s)— Defines the shaders and passes available within the effect.

There are other sections possible within an effect file, such as a geometry shader and a texture sampler. These will be discussed in other sections.

Below you’ll find an example that contains all the necessary parts of a valid effect.

// constant buffer of external variables
cbuffer Variables
{
    matrix Projection;
};

// PS_INPUT - input variables to the pixel shader
// This struct is created and filled in by the vertex shader
struct PS_INPUT
{
    float4 Pos : SV_POSITION;
    float4 Color : COLOR0;
};

////////////////////////////////////////////////
// Vertex Shader - Main Function
///////////////////////////////////////////////
PS_INPUT VS(float4 Pos : POSITION, float4 Color : COLOR)
{
    PS_INPUT psInput;
    // Pass through both the position and the color
    psInput.Pos = mul(Pos, Projection);
    psInput.Color = Color;

    return psInput;
}

///////////////////////////////////////////////
// Pixel Shader
///////////////////////////////////////////////
float4 PS(PS_INPUT psInput) : SV_Target
{
    return psInput.Color;
}

// Define the technique
technique10 Render
{
    pass P0
    {
        SetVertexShader( CompileShader(vs_4_0, VS() ));
        SetGeometryShader(NULL);
        SetPixelShader( CompileShader(ps_4_0, PS() ));
    }
}

					  

Loading an Effect File

Effect files are loaded using the D3DX10CreateEffectFromFile function. When loading most effect files, you can pass this function a series of default parameters, causing it to simply load the single effect file from the path you specify in the first parameter.

The key parameters to this function are, of course, the path and file name of the effect file to load, as well as the D3D10_SHADER_ENABLE_STRICTNESS flag. This flag tells the shader compiler to accept only valid syntax and to warn you when you attempt to use anything that is deprecated. The load function also requires you to pass in a pointer to the current Direct3D device and a pointer to an ID3D10Effect interface where it can store the newly created effect object.

// The effect object
ID3D10Effect* pEffect = NULL;

// Load the effect file and create the effect object
HRESULT hr = D3DX10CreateEffectFromFile(L"..\\simple.fx", // filename
    NULL,                            // no preprocessor defines
    NULL,                            // no include handler
    "fx_4_0",                        // shader version (effect profile)
    D3D10_SHADER_ENABLE_STRICTNESS,  // HLSL compile flags
    0,                               // effect compile flags
    pD3DDevice,                      // the Direct3D device
    NULL,                            // no effect pool
    NULL,                            // no thread pump (load synchronously)
    &pEffect,                        // receives the created effect
    NULL,                            // no error blob requested
    NULL);                           // no deferred HRESULT

if (FAILED(hr))
{
    return false;
}

If you require more advanced functionality, please review the DirectX SDK documentation for this function.
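
One piece of that more advanced functionality is worth calling out here: the function can return the shader compiler's error messages through an ID3D10Blob parameter, which is passed as NULL in the example above. The following is a minimal sketch of how those messages might be surfaced; it assumes the same pD3DDevice as before, and the variable names are illustrative.

// Capture compiler output when loading the effect
ID3D10Blob* pErrors = NULL;

HRESULT hr = D3DX10CreateEffectFromFile(L"..\\simple.fx",
    NULL, NULL, "fx_4_0",
    D3D10_SHADER_ENABLE_STRICTNESS, 0,
    pD3DDevice,
    NULL, NULL,
    &pEffect,
    &pErrors,   // receives the compiler's error messages on failure
    NULL);

if (FAILED(hr))
{
    if (pErrors != NULL)
    {
        // The blob contains the compiler messages as text
        OutputDebugStringA((char*)pErrors->GetBufferPointer());
        pErrors->Release();
    }
    return false;
}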

External Variables and Constant Buffers

Most effects will need additional input beyond just the list of vertices; this is where external variables are useful. External variables are the variables declared within your effects that are visible from within your application code. Variables that receive information such as the current frame time, the world or projection matrices, or light positions can be declared within the effect so they can be updated from the calling program.

With the introduction of Direct3D10, all external variables now reside in constant buffers. Constant buffers are used to group variables visible to the calling program so that they can be optimized for access. Constant buffers are similar in definition to structures and are created using the cbuffer keyword.

cbuffer Variables
{
    matrix Projection;
};

Constant buffers are commonly declared at the top of an effect file and reside outside of any other section. It can be useful to group variables into buffers based on how frequently they are updated; for instance, variables that are set once at initialization would be grouped separately from variables that are updated on a frame-by-frame basis. You can create multiple constant buffers.

Once the effect file is loaded, you can bind its external variables to effect-variable objects within your application. The following code shows how the external variable “Projection” is bound to an ID3D10EffectMatrixVariable in the application.

// declare the effect variable
ID3D10EffectMatrixVariable* pProjectionMatrixVariable = NULL;

// bind the effect variable to the external variable in the effect file
pProjectionMatrixVariable =
    modelObject->pEffect->GetVariableByName("Projection")->AsMatrix();

// update the effect variable with the correct data
pProjectionMatrixVariable->SetMatrix((float*)&finalMatrix);
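
The same pattern applies to the other effect variable types. As a hedged sketch, suppose the effect also declared a float4 external variable named LightColor (it is not part of the example effect above); it could be bound and updated through the vector interface.

// Hypothetical: bind a float4 external variable named "LightColor"
// (this variable is not declared in the example effect shown earlier)
ID3D10EffectVectorVariable* pLightColorVariable = NULL;
pLightColorVariable =
    modelObject->pEffect->GetVariableByName("LightColor")->AsVector();

// update the effect variable with a four-component color
float lightColor[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
pLightColorVariable->SetFloatVector(lightColor);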

Input and Output Structures

Effect files routinely need to pass multiple values between shaders; to keep things simple, the variables are passed within a structure. A structure allows more than one variable to be bundled together into an easy-to-send package and helps minimize the work needed when adding a new variable.

For instance, vertex shaders commonly need to pass values like vertex position, color, or normal value along to the pixel shader. Since the vertex shader has the limitation of a single return value, it simply packages the needed variables into the structure and sends it to the pixel shader. The pixel shader then accesses the variables within the structure. An example structure called PS_INPUT is shown next.

// PS_INPUT - input variables to the pixel shader
// This struct is created and filled in by the vertex shader
struct PS_INPUT
{
    float4 Pos : SV_POSITION;
    float4 Color : COLOR0;
};

Using the structure is simple. First, an instance of the structure is created within the vertex shader. Next, the individual structure members are filled in, and then the structure is returned. The next shader in the pipeline uses the PS_INPUT structure as its input and has access to the variables you set. A simple vertex shader is shown here to demonstrate the definition and usage of a structure.

////////////////////////////////////////////////
// Vertex Shader - Main Function
///////////////////////////////////////////////
PS_INPUT VS(float4 Pos : POSITION, float4 Color : COLOR)
{
    PS_INPUT psInput;

    // Pass through both the position and the color
    psInput.Pos = mul( Pos, Projection );
    psInput.Color = Color;

    return psInput;
}

Technique Blocks

Effect files combine the functionality of multiple shaders into a single block called a technique. Techniques are a way to define how something should be drawn. For instance, you can define a technique that supports translucency as well as an opaque technique. By switching between techniques, you change the objects being drawn from solid to see-through.

Techniques are defined within an effect file using the technique10 keyword followed by the name of the technique being created.

technique10 Render
{
    // technique definition
}

Techniques apply their functionality in passes, whether the rendering they describe is simple or complex. Each pass updates or changes the render state and the shaders being applied to the scene. Because not every effect you come up with can be applied in a single pass, techniques give you the ability to define more than one; some post-processing effects, such as depth of field, require multiple passes.

Passes

Each pass is created using the pass keyword followed by its pass level. The pass level is a combination of the letter P followed by the number of the pass.

In the following example, there are two passes, P0 and P1, being defined. At least one pass must be defined for the technique to be valid.

technique10 Render
{
    pass P0
    {
        // pass shader definitions
    }

    pass P1
    {
        // pass shader definitions
    }
}
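
On the application side, the technique is what ties the effect back into your rendering code. As a minimal sketch, assuming the effect was loaded into pEffect as shown earlier and that your vertex buffer and input layout are already set, a draw routine typically fetches the technique by name and applies each of its passes in turn; the numVertices variable is illustrative.

// Look up the technique defined in the effect file
ID3D10EffectTechnique* pTechnique = pEffect->GetTechniqueByName("Render");

// Find out how many passes the technique contains
D3D10_TECHNIQUE_DESC techniqueDesc;
pTechnique->GetDesc(&techniqueDesc);

// Apply each pass and draw the geometry with it
for (UINT p = 0; p < techniqueDesc.Passes; ++p)
{
    // Set the shaders and state defined by this pass on the device
    pTechnique->GetPassByIndex(p)->Apply(0);

    // The draw call is illustrative; the vertex count depends on your geometry
    pD3DDevice->Draw(numVertices, 0);
}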

Setting the Shaders in a Pass

The main job of each pass is to set the three types of shaders: the vertex shader, the geometry shader, and the pixel shader. Because the shaders you use can differ from pass to pass, they must be explicitly set using the functions SetVertexShader, SetGeometryShader, and SetPixelShader.

technique10 Render
{
    pass P0
    {
        // Define the vertex shader for this pass
        SetVertexShader( CompileShader(vs_4_0, VS() ));
        // No Geometry shader needed, pass NULL
        SetGeometryShader(NULL);
        // Define the pixel shader for this pass
        SetPixelShader( CompileShader(ps_4_0, PS() ));
    }
}

Note

In a pass definition, vertex and pixel shaders are required, while geometry shaders are optional.


As you can see, the shader setting functions include a call to the function CompileShader.

CompileShader( shader target, shader function )

CompileShader takes the shader you defined within the effect file and converts it into a format usable by Direct3D. It takes two parameters: the shader target value and the name of the shader function.

The shader target value specifies the shader level to use when compiling. Direct3D10 supports shader model 4.0, so the values vs_4_0 and ps_4_0 are used.

In the previous example, the vertex shader is using the function VS() while the pixel shader is using PS(). The names of these functions can change to suit your needs. Both of the shaders are being set to use shader model 4.0.
